Web Survey Bibliography
Title Respondents of a follow-up web-based survey
Author Stoddard, S. A.; Amparo, P.; Popick, H.; Yudd, R.; Sujeer, A.; Baath, M.
Year 2016
Access date 13.03.2016
Abstract Background: The use of web-based surveys has increased in recent years, driven by rising internet access and lower costs. However, studies have found disproportionate response to web surveys from male, younger, and better-educated adults. Household internet access is higher in Santa Clara County (SCC), California, located in Silicon Valley, than the national figure (87% versus 73%). We conducted a web-based follow-up survey to a random-digit-dial (RDD) telephone survey in SCC to assess whether demographic biases between web and RDD telephone respondents were still evident in an area with high internet access.
Methods: Data are from a survey of adults conducted in 2013-14. The web component was sent to eligible telephone survey respondents (those who completed the phone survey in English, reported recent internet use, agreed to be re-contacted, and provided a valid email address). Both surveys were conducted by Westat, a research firm headquartered in Rockville, MD. Among 4,186 telephone respondents, 1,176 were eligible and invited to take the web survey (web survey response rate = 41.0%). We compared the characteristics of respondents eligible for and completing the web survey (N=482) with those of the telephone survey respondents.
Results: Relative to telephone survey respondents, those eligible for the web survey were more likely to be male (47% versus 41%), non-Hispanic White (67% versus 58%), college graduates (68% versus 45%), have household incomes of $75,000 or more (57% versus 45%), be US born (73% versus 65%), and be younger (mean age, 55 versus 59). Web respondents did not differ substantially by gender from telephone respondents (43% versus 41% male), but were even more likely to be White (82% versus 58%), college graduates (75% versus 45%), have household incomes of $75,000 or more (66% versus 45%), and be US born (84% versus 65%). Unlike for eligibility, mean age was similar (both 59).
Conclusion: Unlike previous studies, we found that in an area with high internet access, web survey respondents were only slightly more likely to be male and were similar in age to RDD telephone survey respondents, even though males and younger adults were more likely to be eligible for the web survey. However, web survey respondents were more likely to be White, well educated, higher income, and US born, and these biases were more pronounced among those who completed the web survey than among those merely eligible.
Access/Direct link Conference - homepage (Abstract)
Year of publication 2015
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - Measurement (1822)
- Forecasting proportional representation elections from non-representative expectation surveys; 2016; Graefe, A.
- Short and Sweet? Length and Informative Content of Open-Ended Responses Using SMS as a Research Mode; 2016; Walsh, E.; Brinker, J. K.
- Adaptive survey designs to minimize survey mode effects – a case study on the Dutch Labor Force...; 2016; Calinescu, M.; Schouten, B.
- Mixing modes of data collection in Swiss social surveys: Methodological report of the LIVES-FORS mixed...; 2016; Roberts, C.; Joye, D.; Staehli, M. E.
- What is the gain in a probability-based online panel to provide Internet access to sampling units that...; 2016; Revilla, M.; Cornilleau, A.; Cousteaux, A-S.; Legleye, S; de Pedraza, P.
- Assessing targeted approach letters: effects in different modes on response rates, response speed and...; 2016; Lynn, P.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Comparing online and telephone survey results in the context of a skin cancer prevention campaign evaluation...; 2016; Hollier, L.P.; Pettigrew, S.; Slevin, T.; Strickland, M.; Minto, C.
- Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk; 2016; Berinsky, A.; Huber, G. A.; Lenz, G. S.
- Setting Up an Online Panel Representative of the General Population The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Report of the Inquiry into the 2015 British general election opinion polls; 2016; Sturgis, P., Baker, N., Callegaro, M., Fisher, St., Green, J., Jennings, W., Kuha, J., Lauderdale, B...
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- The Dynamic Identity Fusion Index: A New Continuous Measure of Identity Fusion for Web-Based Questionnaires...; 2016; Jimenez, J.; Gomez, A.; Buhrmester, M.; Whitehouse, H.; Swann, W. B.
- Recommended Practices for the design of business surveys questionnaires; 2016; Macchia, S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- Respondents of a follow-up web-based survey; 2016; Stoddard, S. A.; Amparo, P.; Popick, H.; Yudd, R.; Sujeer, A.; Baath, M.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- Participant recruitment and data collection through Facebook: the role of personality factors; 2016; Rife, S. C.; Cate, K. L.; Kosinski, M.; Stillwell, D.
- Quantifying Under- and Overreporting in Surveys Through a Dual-Questioning-Technique Design; 2016; de Jong, M.; Fox, J.-P.; Steenkamp, J.-B. E. M.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A., Kunz, T., Fuchs, M.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Sunday shopping – The case of three surveys; 2016; Bethlehem, J.
- Creation and Usability Testing of a Web-Based Pre-Scanning Radiology Patient Safety and History Questionnaire...; 2016; Robinson, T. J.; DuVall, S.; Wiggins III, R.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Improving social media measurement in surveys: Avoiding acquiescence bias in Facebook research; 2016; Kuru, O.; Pasek, J.
- Psychological research in the internet age: The quality of web-based data; 2016; Ramsey, S. R.; Thompson, K. L.; McKenzie, M.; Rosenbaum, A.
- Moderators of Candidate Name-Order Effects in Elections: An Experiment; 2016; Kim, Nu.; Krosnick, J. A.; Casasanto, D.
- Measuring Generalized Trust: An Examination of Question Wording and the Number of Scale Points; 2016; Lundmark, S.; Giljam, M.; Dahlberg, S.
- Equivalence of paper-and-pencil and computerized self-report surveys in older adults; 2016; Weigold, A.; Weigold, I. K.; Drakeford, M. K.; Dykema, S. A.; Smith, C. A.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel...; 2016; Golden, L.; Albaum, G.; Roster, C. A.; Smith, S. M.
- Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias?; 2016; Eckman, S.
- Investigating respondent multitasking in web surveys using paradata; 2016; Sendelbah, A.; Vehovar, V.; Slavec, A.; Petrovcic, A.
- Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments; 2016; Struminskaya, B.
- Swapping bricks for clicks: Crowdsourcing longitudinal data on Amazon Turk; 2016; Daly, T. M.; Nataraajan, R.
- A reliability analysis of Mechanical Turk data; 2016; Rouse, S. V.
- Quota Controls in Survey Research; 2016; Gittelman, S. H.; Thomas, R. K.; Lavrakas, P. J.; Lange, V.
- Presentation matters: how mode effects in item non-response depend on the presentation of response options...; 2016; Zeglovits, E.; Schwarzer, S.
- Internet-administered Health-related Quality of Life Questionnaires Compared With Pen and Paper in an...; 2016; Nitikman, M.; Mulpuri, K.; Reilly, C. W.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Doing Surveys Online ; 2016; Toepoel, V.
- Exploring Factors in Contributing Student Progress in the Open University; 2016; Arifin, M. H.
- Are Fast Responses More Random? Testing the Effect of Response Time on Scale in an Online Choice Experiment...; 2015; Boerger, T.